- Facebook was slammed for enabling sexist job advertising on the platform and breaking equality law in the way that it handles ads
- Reportedly, 96 percent of the people shown an ad for mechanic jobs on Facebook were men and 95 percent of those shown an ad for nursery nurse jobs were women
LONDON: Facebook was slammed on Thursday for enabling sexist job advertising on the platform and breaking equality law in the way that it handles ads.
Global Witness, a UK-based campaign group, accused Facebook of failing to prevent discriminatory ad targeting, which it claims is the result of biased algorithms that determine who sees the ads.
The group conducted an investigation into how Facebook determines who views which ads.
Results indicated that 96 percent of the people shown an ad for mechanic jobs on Facebook were men and 95 percent of those shown an ad for nursery nurse jobs were women.
Similarly, 75 percent of those shown an ad for pilot jobs were men and 77 percent of those shown an ad for psychologist jobs were women.
To corroborate the results, Global Witness conducted its own experiments to determine the extent to which Facebook’s ad system allows discriminatory targeting.
In one experiment, the group submitted two job ads for approval and asked Facebook not to show one to women and the other to anyone over the age of 55.
Facebook approved both ads for publication but requested that the organization tick a box saying it would not discriminate against these groups.
As the ads were a part of the experiment, both were pulled offline before publication.
In response, a Facebook spokesperson said: “Our system takes into account different kinds of information to try and serve people ads they will be most interested in and we are reviewing the findings within this report.”
“We’ve been exploring expanding limitations on targeting options for job, housing and credit ads to other regions beyond the US and Canada, and plan to have an update in the coming weeks.”
Global Witness submitted the investigation’s results to barrister Shona Jolly, QC, who authorized their referral to the UK Equality and Human Rights Commission for further investigation.
Facebook’s algorithms caused a stir recently when users who watched a video of black men were shown an automated prompt asking whether they wanted to keep seeing videos about “primates.”
The video, which was shared on the platform last week, featured clips of black men in altercations with white civilians and police officers.
The social media giant apologized for the error and said that, while it is making changes to its algorithms, it still relies on AI methods that are “not perfect.”
“While we have made improvements to our AI, we know it’s not perfect and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations,” a Facebook spokesperson said.